Scaling Law for Recovering the Sparsest Element in a Subspace

Author

  • Laurent Demanet
Abstract

We address the problem of recovering a sparse n-vector within a given subspace. This problem is a subtask of some approaches to dictionary learning and sparse principal component analysis; proving scaling laws for this recovery step therefore makes it easier to derive and prove recovery results in those applications. In this paper, we present a scaling law for recovering the sparse vector from a subspace spanned by the sparse vector and k random vectors. We prove that the sparse vector is the output of one of n linear programs with high probability if its support size s satisfies s ≲ n/(√k log n). The scaling law still holds when the desired vector is only approximately sparse. To obtain a single estimate for the sparse vector from the n linear programs, we must select the sparsest output. This selection can be based on any proxy for sparsity, and the specific proxy has the potential to improve or worsen the scaling law. If sparsity is interpreted in an ℓ1/ℓ∞ sense, then the scaling law can be no better than s ≲ n/√k. Computer simulations show that selecting the sparsest output in the ℓ1/ℓ2 or thresholded-ℓ0 sense can lead to a larger parameter range for successful recovery than that given by the ℓ1/ℓ∞ sense.

Keywords: sparsity, linear programming, signal recovery, sparse principal component analysis, dictionary learning.
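The procedure the abstract describes — one ℓ1-minimizing linear program per coordinate of the candidate vector, followed by selecting the sparsest of the n outputs — can be sketched as below. This is a minimal illustration, not the authors' implementation: the basis matrix `A`, the use of `scipy.optimize.linprog`, and the ℓ1/ℓ2 selection proxy are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import linprog

def sparsest_in_subspace(A):
    """Search for the sparsest element of span(A), where A is n x d.

    For each row index i, solve the linear program
        min ||A c||_1   subject to   (A c)_i = 1,
    then return the candidate with the smallest l1/l2 ratio
    (one common proxy for sparsity).
    """
    n, d = A.shape
    best, best_score = None, np.inf
    for i in range(n):
        # Variables z = [c (d); t (n)]; minimize sum(t) with -t <= A c <= t,
        # so at the optimum t_j = |(A c)_j| and sum(t) = ||A c||_1.
        obj = np.concatenate([np.zeros(d), np.ones(n)])
        A_ub = np.block([[A, -np.eye(n)], [-A, -np.eye(n)]])
        b_ub = np.zeros(2 * n)
        A_eq = np.concatenate([A[i], np.zeros(n)])[None, :]  # (A c)_i = 1
        res = linprog(obj, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(None, None)] * (d + n))
        if not res.success:
            continue
        x = A @ res.x[:d]
        score = np.linalg.norm(x, 1) / np.linalg.norm(x, 2)
        if score < best_score:
            best, best_score = x, score
    return best
```

When `A` spans a planted sparse vector plus k random vectors and the support size is within the scaling law, the LPs anchored at support coordinates recover the planted vector (up to scale) with high probability, and the ℓ1/ℓ2 proxy then selects it among the n candidates.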


Similar articles

Recovering the Sparsest Element in a Subspace

We address the problem of recovering a sparse n-vector from an arbitrary basis of a subspace spanned by the vector and k random vectors. We prove that the sparse vector is the output of one of n linear programs with high probability if its support size s satisfies s ≲ n/(√k log n). The scaling law still holds when the desired vector is approximately sparse. To get a single estimate for the ...


The Scaling Law for the Discrete Kinetic Growth Percolation Model

The critical exponent of the total number of finite clusters, α, is calculated directly, without using the scaling hypothesis, both below and above the percolation threshold pc, based on a kinetic growth percolation model in two and three dimensions. Simultaneously, we can calculate the other critical exponents β and γ, and show that the scal...


A New Guideline for the Allocation of Multipoles in the Multiple Multipole Method for Two Dimensional Scattering from Dielectrics

A new guideline for proper allocation of multipoles in the multiple multipole method (MMP) is proposed. In an ‘a posteriori’ approach, subspace fitting (SSF) is used to find the best location of multipole expansions for the two dimensional dielectric scattering problem. It is shown that the best location of multipole expansions (regarding their global approximating power) coincides with the med...


Subspace-Sparse Representation

Given an overcomplete dictionary A and a signal b that is a linear combination of a few linearly independent columns of A, classical sparse recovery theory deals with the problem of recovering the unique sparse representation x such that b = Ax. It is known that under certain conditions on A, x can be recovered by the Basis Pursuit (BP) and the Orthogonal Matching Pursuit (OMP) algorithms. In t...
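One of the two classical algorithms mentioned in this snippet, Orthogonal Matching Pursuit, can be sketched in a few lines. This is a hedged illustration of the standard greedy scheme, not code from the cited paper; the dictionary `A`, the known sparsity level `s`, and the helper name `omp` are assumptions of the example.

```python
import numpy as np

def omp(A, b, s):
    """Orthogonal Matching Pursuit: greedily select s columns of A to explain b.

    A is an n x m dictionary (ideally with unit-norm columns); returns a
    sparse coefficient vector x with support of size s such that A x ~ b.
    """
    residual, support = b.copy(), []
    for _ in range(s):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        # Least-squares fit on the selected columns, then update the residual,
        # which becomes orthogonal to the span of the chosen columns.
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        residual = b - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x
```

Under the usual incoherence-type conditions on A, this greedy loop recovers the unique s-sparse representation exactly; Basis Pursuit instead solves a single ℓ1-minimization program over all of x.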


Explicit Stiffness of Tapered and Monosymmetric I Beam-Columns

A formulation for finite element analysis of tapered and monosymmetric I-shaped beam-columns is presented. This is a general way to analyze these types of complex elements. Based upon the formulation, the member stiffness matrix is obtained explicitly. The element considered has seven nodal degrees of freedom. In addition, the related stability matrix is found. Numerical studies of the aforementio...



Journal:

Volume   Issue 

Pages  -

Publication date: 2014